Tensor Q-rank: new data dependent definition of tensor rank
Abstract
Recently, Tensor Nuclear Norm (TNN) regularization based on the t-SVD has been widely used in various low tubal-rank tensor recovery tasks. However, these models usually require a smooth change of the data along the third dimension to ensure their low rank structures. In this paper, we propose a new data dependent definition of tensor rank, named tensor Q-rank, based on a learnable orthogonal matrix $$\mathbf{Q}$$, and further introduce a unified data dependent low rank tensor recovery model. According to the low rank hypothesis, we introduce two explainable selection methods for $$\mathbf{Q}$$, under which the data tensor may have a more significant low Q-rank structure than its low tubal-rank structure. Specifically, maximizing the variance of the singular value distribution leads to the Variance Maximization Tensor Q-Nuclear norm (VMTQN), while minimizing the nuclear norm through manifold optimization leads to the Manifold Optimization Tensor Q-Nuclear norm (MOTQN). Moreover, we apply the proposed model to the tensor completion problem, give an effective algorithm, and briefly analyze why our method works better than TNN-based methods in the case of complex data with a low sampling rate. Finally, experimental results on real-world datasets demonstrate the superiority of the proposed models in the tensor completion problem with respect to other tensor rank regularization models.
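For illustration, the sketch below shows how a Q-dependent tensor nuclear norm of this kind can be evaluated numerically, assuming a third-order tensor and an already chosen orthogonal matrix; a random orthogonal matrix stands in for the learned $$\mathbf{Q}$$, and the helper name tensor_q_nuclear_norm is ours, not the paper's.

```python
import numpy as np

def tensor_q_nuclear_norm(X, Q):
    """Sum of matrix nuclear norms of the frontal slices of X x_3 Q^T.

    X : (n1, n2, n3) data tensor.
    Q : (n3, n3) orthogonal matrix; Q = I gives a plain slice-wise sum of
        nuclear norms, while a DCT-like orthogonal choice gives a TNN-style norm.
    """
    # mode-3 product: mix the tube fibers of X with Q^T
    X_hat = np.tensordot(X, Q.T, axes=([2], [1]))        # still (n1, n2, n3)
    # nuclear norm of every frontal slice in the transformed domain
    return sum(np.linalg.norm(X_hat[:, :, k], ord='nuc')
               for k in range(X_hat.shape[2]))

# toy usage: a random tensor and a random orthogonal Q (stand-in for the learned one)
rng = np.random.default_rng(0)
X = rng.standard_normal((30, 30, 5))
Q, _ = np.linalg.qr(rng.standard_normal((5, 5)))         # random orthogonal matrix
print(tensor_q_nuclear_norm(X, Q))
```

In the paper's setting $$\mathbf{Q}$$ is not fixed in advance but selected from the data (e.g., by the variance-maximization or manifold-optimization criteria named above); the function here only evaluates the norm once such a matrix is given.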
Similar References
Efficient tensor completion: Low-rank tensor train
This paper proposes a novel formulation of the tensor completion problem to impute missing entries of data represented by tensors. The formulation is introduced in terms of the tensor train (TT) rank, which can effectively capture global information of tensors thanks to its construction by a well-balanced matricization scheme. Two algorithms are proposed to solve the corresponding tensor completion p...
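As a rough sketch of the TT-rank notion mentioned above (our own illustration under the usual definition, not code from the cited paper), the k-th TT rank of an exactly decomposable tensor coincides with the rank of its k-th sequential matricization:

```python
import numpy as np

def tt_matricization_ranks(X, tol=1e-10):
    """Ranks of the sequential matricizations X_[k] of a d-way tensor X.

    X_[k] reshapes the first k modes into rows and the remaining d-k modes
    into columns; its rank gives the k-th TT rank of X.
    """
    dims = X.shape
    ranks = []
    for k in range(1, len(dims)):
        mat = X.reshape(int(np.prod(dims[:k])), int(np.prod(dims[k:])))
        ranks.append(np.linalg.matrix_rank(mat, tol=tol))
    return ranks

# toy usage: a 4-way rank-1 outer product has TT ranks (1, 1, 1)
a, b, c, d = (np.arange(n) + 1.0 for n in (2, 3, 4, 5))
X = np.einsum('i,j,k,l->ijkl', a, b, c, d)
print(tt_matricization_ranks(X))   # [1, 1, 1]
```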
Low-rank Tensor Approximation
Approximating a tensor by another of lower rank is in general an ill-posed problem. Yet, this kind of approximation is mandatory in the presence of measurement errors or noise. We show how tools recently developed in compressed sensing can be used to solve this problem. More precisely, a minimal angle between the columns of loading matrices allows us to restore both existence and uniqueness of the...
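A small sketch of the kind of angular condition alluded to above (our own illustration; the helper min_column_angle is an assumption, not the cited paper's code): the minimal angle between the columns of a loading matrix can be read off its normalized Gram matrix.

```python
import numpy as np

def min_column_angle(A):
    """Smallest pairwise angle (in radians) between the columns of A.

    A larger minimal angle (lower mutual coherence) is the kind of condition
    under which a best low-rank approximation becomes well posed.
    """
    A = A / np.linalg.norm(A, axis=0, keepdims=True)   # unit-norm columns
    G = np.abs(A.T @ A)                                # |cosines| of pairwise angles
    np.fill_diagonal(G, 0.0)                           # ignore self-angles
    return np.arccos(G.max())

# toy usage: nearly collinear columns give a small minimal angle
A = np.array([[1.0, 1.0, 0.0],
              [0.0, 0.1, 1.0],
              [0.0, 0.0, 0.1]])
print(np.degrees(min_column_angle(A)))
```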
Exploring Tensor Rank
We consider the problem of tensor rank. We define tensor rank, discuss the motivations behind exploring the topic, and give some examples of the difficulties we face when trying to compute tensor rank. Some simpler lower and upper bounds for tensor rank are proven, and two techniques for giving lower bounds are explored. Finally we give one explicit example of a construction of an n×n×n tensor...
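As a concrete reminder of the definition discussed here (a standard textbook example, not taken from the cited abstract), the rank of a third-order tensor is the smallest number of rank-one terms needed to represent it, $$\operatorname{rank}(\mathcal{T}) = \min\big\{ r : \mathcal{T} = \sum_{i=1}^{r} a_i \circ b_i \circ c_i \big\},$$ and, unlike the matrix case, it can depend on the base field: the 2×2×2 tensor with frontal slices $$\begin{pmatrix}1&0\\0&1\end{pmatrix}$$ and $$\begin{pmatrix}0&1\\-1&0\end{pmatrix}$$ (the multiplication tensor of the complex numbers) has rank 3 over the reals but rank 2 over the complex numbers.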
Tensor Rank is NP-Complete
We prove that computing the rank of a three-dimensional tensor over any finite field is NP-complete. Over the rational numbers the problem is NP-hard. Warning: Essentially this paper has been published in Journal of Algorithms and is hence subject to copyright restrictions. It is for personal use only. 1. Introduction. One of the most fundamental quantities in linear algebra is the rank of a matrix....
Provable Low-Rank Tensor Recovery
In this paper, we rigorously study tractable models for provably recovering low-rank tensors. Unlike their matrix-based predecessors, current convex approaches for recovering low-rank tensors based on incomplete (tensor completion) and/or grossly corrupted (tensor robust principal analysis) observations still suffer from the lack of theoretical guarantees, although they have been used in variou...
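One common convex formulation of the kind referred to here is the sum-of-nuclear-norms (SNN) tensor completion model, shown as a general illustration rather than the exact model analyzed in the cited paper: $$\min_{\mathcal{X}} \ \sum_{k=1}^{3} \lambda_k \big\| \mathcal{X}_{(k)} \big\|_* \quad \text{s.t.} \quad \mathcal{P}_{\Omega}(\mathcal{X}) = \mathcal{P}_{\Omega}(\mathcal{M}),$$ where $$\mathcal{X}_{(k)}$$ is the mode-k unfolding, $$\|\cdot\|_*$$ the matrix nuclear norm, and $$\mathcal{P}_{\Omega}$$ keeps only the observed entries of the data tensor $$\mathcal{M}$$.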
Journal
Journal title: Machine Learning
Year: 2021
ISSN: ['0885-6125', '1573-0565']
DOI: https://doi.org/10.1007/s10994-021-05987-8